Structured Semantic Knowledge Can Emerge Automatically from Predicting Word Sequences in Child-Directed Speech
Authors
Abstract
Previous research has suggested that distributional learning mechanisms may contribute to the acquisition of semantic knowledge. However, distributional learning mechanisms, statistical learning, and contemporary "deep learning" approaches have been criticized as incapable of learning the kind of abstract, structured knowledge that many think is required for the acquisition of semantic knowledge. In this paper, we show that recurrent neural networks, trained on noisy, naturalistic speech to children, do in fact learn what appears to be abstract and structured knowledge. We trained two types of recurrent neural networks (a Simple Recurrent Network [SRN] and a Long Short-Term Memory [LSTM] network) to predict word sequences in a 5-million-word corpus of speech directed to children ages 0-3 years, and assessed what semantic knowledge they acquired. We found that the learned internal representations encode abstract grammatical and semantic features that are useful for predicting word sequences. Assessing the organization of semantic knowledge in terms of its similarity structure, we found evidence of emergent categorical and hierarchical structure in both models. The LSTM and the SRN learned very similar kinds of representations, but the LSTM achieved higher performance on a quantitative evaluation. We also trained a non-recurrent neural network, Skip-gram, on the same input to compare our results to the state of the art in machine learning. We found that Skip-gram achieves performance similar to the LSTM, but represents words more in terms of thematic than taxonomic relations, and we provide reasons why this might be the case. Our findings show that a learning system that derives abstract, distributed representations for the purpose of predicting sequential dependencies in naturalistic language may provide insight into the emergence of many properties of the developing semantic system.
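The training setup the abstract describes, a recurrent network predicting each next word in a corpus, can be sketched minimally. The toy corpus, layer sizes, and untrained random weights below are illustrative assumptions, not the authors' model or data; only the SRN forward pass and the cross-entropy next-word objective are shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy corpus standing in for the child-directed speech data.
corpus = "the dog chased the cat and the cat saw the dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, H = len(vocab), 8  # vocabulary size, hidden layer size (assumed)

# Randomly initialized weights: input-to-hidden, recurrent, hidden-to-output.
Wxh = rng.normal(0.0, 0.1, (H, V))
Whh = rng.normal(0.0, 0.1, (H, H))
Why = rng.normal(0.0, 0.1, (V, H))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def srn_step(h, word):
    """One Elman-style step: fold the current word into the hidden state,
    then output a probability distribution over the next word."""
    x = np.zeros(V)
    x[idx[word]] = 1.0
    h = np.tanh(Wxh @ x + Whh @ h)
    return h, softmax(Why @ h)

# Forward pass: accumulate cross-entropy of each next-word prediction.
h = np.zeros(H)
total_loss = 0.0
for w, nxt in zip(corpus[:-1], corpus[1:]):
    h, p = srn_step(h, w)
    total_loss -= np.log(p[idx[nxt]])
mean_loss = total_loss / (len(corpus) - 1)
```

Training would backpropagate this loss through time to update the three weight matrices; after training, the hidden states serve as the distributed word representations whose similarity structure the paper analyzes.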
Similar resources
Prosodic Features from Large Corpora of Child-Directed Speech as Predictors of the Age of Acquisition of Words
The impressive ability of children to acquire language is a widely studied phenomenon, and the factors influencing the pace and patterns of word learning remain a subject of active research. Although many models predicting the age of acquisition of words have been proposed, little emphasis has been directed to the raw input children receive. In this work we present a comparatively large-scale ...
Statistical Speech Segmentation and Word Learning in Parallel: Scaffolding from Child-Directed Speech
In order to acquire their native languages, children must learn richly structured systems with regularities at multiple levels. While structure at different levels could be learned serially, e.g., speech segmentation coming before word-object mapping, redundancies across levels make parallel learning more efficient. For instance, a series of syllables is likely to be a word not only because of ...
Automatically deriving structured knowledge bases from on-line dictionaries
keywords: computational lexicography; lexical knowledge bases We describe an automated strategy which exploits on-line dictionaries to construct a richly-structured lexical knowledge base. In particular, we show how the Longman Dictionary of Contemporary English (LDOCE) can be used to build a directed graph which captures semantic associations between words. The result is a huge and highly inte...
Semantics at Scale: When Distributional Semantics meets Logic Programming
Distributional semantic models (DSMs) are semantic models which are automatically built from co-occurrence patterns in unstructured text. These semantic models trade representation structure for volume of semantic and commonsense knowledge, and provide effective large-scale semantic models which can be used to complement logical knowledge bases. DSMs can be used to inject large scale commonsens...
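The co-occurrence construction this abstract refers to can be illustrated briefly: count how often words appear near each other in a context window, then compare words by the cosine similarity of their count vectors. The toy corpus, window size, and raw-count weighting are assumptions for the sketch; real DSMs use much larger corpora and weighting schemes such as PPMI.

```python
import numpy as np

# Hypothetical toy corpus; any unstructured text would do.
corpus = ("the cat drinks milk the dog drinks water "
          "the cat chases the dog").split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, window = len(vocab), 2  # window size is an assumed hyperparameter

# Count co-occurrences within +/- `window` tokens of each position.
counts = np.zeros((V, V))
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            counts[idx[w], idx[corpus[j]]] += 1

def cosine(u, v):
    """Cosine similarity between two co-occurrence vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words in similar contexts get similar vectors: here "cat" and "dog"
# are more alike than "cat" and "milk".
sim_cat_dog = cosine(counts[idx["cat"]], counts[idx["dog"]])
sim_cat_milk = cosine(counts[idx["cat"]], counts[idx["milk"]])
```

With a symmetric window, the count matrix is symmetric; each row is a simple distributional representation of the corresponding word.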
Psychometric Properties of the Persian Word Pairs Task for Declarative Memory Assessment
Objective: According to the declarative/procedural model, the semantic aspect of language depends on the brain structures responsible for declarative memory. The word pairs task is a common tool for evaluating declarative memory. The current study aimed to design a valid and reliable task for evaluating declarative memory in Persian children at learning and retention stages and to investigate i...
Volume 9
Publication date: 2018